Quantum Regularized Least Squares
Authors
Shantanav Chakraborty, Aditya Morolia, Anurudh Peduri
Abstract
Linear regression is a widely used technique to fit linear models and finds widespread applications across different areas such as machine learning and statistics. In most real-world scenarios, however, linear regression problems are often ill-posed or the underlying model suffers from overfitting, leading to erroneous or trivial solutions. This is often dealt with by adding extra constraints to the model, a procedure known as regularization. In this paper, we use the frameworks of block-encoding and quantum singular value transformation (QSVT) to design the first quantum algorithms for quantum least squares with general $\ell_2$-regularization. These include regularized versions of quantum ordinary least squares, quantum weighted least squares, and quantum generalized least squares. Our quantum algorithms substantially improve upon prior results on quantum ridge regression (polynomial improvement in the condition number and an exponential improvement in accuracy), which is a particular case of our result. To this end, we assume approximate block-encodings of the underlying matrices as input and use robust QSVT algorithms for various linear algebra operations. In particular, we develop a variable-time quantum algorithm for matrix inversion using QSVT, where we use quantum eigenvalue discrimination as a subroutine instead of gapped phase estimation. This ensures that substantially fewer ancilla qubits are required for this procedure than in prior results. Owing to the generality of the block-encoding framework, our quantum algorithms are applicable to a variety of input models and can also be seen as improved and generalized versions of prior results on standard (non-regularized) quantum least squares algorithms.
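To make the setting concrete, here is a minimal classical sketch (NumPy; the function name, data, and parameter values are illustrative, not from the paper) of $\ell_2$-regularized least squares solved through the SVD. The closed-form minimizer of $\|Ax - b\|^2 + \lambda\|x\|^2$ transforms each singular value $\sigma$ of $A$ by $f(\sigma) = \sigma/(\sigma^2 + \lambda)$; a QSVT-based algorithm implements a polynomial approximation of this kind of singular value transformation directly on a block-encoding of $A$.

```python
# Minimal classical sketch of l2-regularized (ridge) least squares via the
# SVD. Illustrative only -- not an implementation of the paper's quantum
# algorithm.
import numpy as np

def ridge_solve(A, b, lam):
    """Minimize ||Ax - b||^2 + lam * ||x||^2 in closed form.

    With A = U @ diag(s) @ Vt, the minimizer is
        x = V @ diag(s / (s**2 + lam)) @ U.T @ b,
    i.e. each singular value s is transformed by f(s) = s / (s**2 + lam).
    Singular values well below sqrt(lam) are damped instead of inverted,
    which is what stabilizes ill-posed problems.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    f_s = s / (s**2 + lam)           # regularized inverse of each singular value
    return Vt.T @ (f_s * (U.T @ b))

# Example: an ill-conditioned A, where lam > 0 keeps the solution finite.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 10)) @ np.diag(np.logspace(0, -8, 10))
b = rng.normal(size=50)
x = ridge_solve(A, b, lam=1e-3)

# Agrees with the normal-equations form (A^T A + lam*I)^{-1} A^T b.
x_ne = np.linalg.solve(A.T @ A + 1e-3 * np.eye(10), A.T @ b)
assert np.allclose(x, x_ne)
```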
Similar Articles
Regularized Least-Squares Classification
We consider the solution of binary classification problems via Tikhonov regularization in a Reproducing Kernel Hilbert Space using the square loss, and denote the resulting algorithm Regularized Least-Squares Classification (RLSC). We sketch the historical developments that led to this algorithm, and demonstrate empirically that its performance is equivalent to that of the well-known Support Ve...
Regularized Least Squares
Here $\{(x_i, y_i)\}_{i=1}^{n}$ are the given data points, $V(\cdot)$ represents the loss function indicating the penalty we pay for predicting $f(x_i)$ when the true value is $y_i$, and $\|f\|_{\mathcal{H}}$ is a Hilbert space norm regularization term with a regularization parameter $\lambda$ which controls the stability of the solution and trades off regression accuracy for a function with small norm in the RKHS $\mathcal{H}$. Denote by $S$ the training set $\{(x_1,$ ...
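For context, the objective this truncated excerpt refers to is the standard Tikhonov regularization functional, which can be reconstructed from the definitions above (the $\tfrac{1}{n}$ averaging of the loss is one common convention) as

$$\min_{f \in \mathcal{H}} \; \frac{1}{n} \sum_{i=1}^{n} V\bigl(f(x_i), y_i\bigr) + \lambda \, \|f\|_{\mathcal{H}}^{2},$$

where the first term measures the fit to the data and the second penalizes functions with large norm in $\mathcal{H}$.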
RLScore: Regularized Least-Squares Learners
RLScore is a Python open source module for kernel based machine learning. The library provides implementations of several regularized least-squares (RLS) type of learners. RLS methods for regression and classification, ranking, greedy feature selection, multi-task and zero-shot learning, and unsupervised classification are included. Matrix algebra based computational short-cuts are used to ensu...
Discriminatively regularized least-squares classification
Over the past decades, regularization theory has been widely applied in various areas of machine learning to derive a large family of novel algorithms. Traditionally, regularization focuses on smoothing only, and does not fully utilize the underlying discriminative knowledge which is vital for classification. In this paper, we propose a novel regularization algorithm in the least-squares sense, calle...
Efficient AUC Maximization with Regularized Least-Squares
Area under the receiver operating characteristics curve (AUC) is a popular measure for evaluating the quality of binary classifiers, and intuitively, machine learning algorithms that maximize an approximation of AUC should have a good AUC performance when classifying new examples. However, designing such algorithms in the framework of kernel methods has proven to be challenging. In this paper, ...
Journal
Journal title: Quantum
Year: 2023
ISSN: 2521-327X
DOI: https://doi.org/10.22331/q-2023-04-27-988